# 19b More examples of linear independence.

First let us remark:

**Remark.** Sometimes we will say the list $v_{1},\ldots,v_{k}$ is linearly independent (resp. linearly dependent) to mean the set $\{\vec v_{1},\ldots,\vec v_{k}\}$ is linearly independent (resp. linearly dependent).

**Remark.** We can only apply the adjectives "linearly independent" or "linearly dependent" to lists or sets of vectors or objects of the "same kind" -- those that can be scaled and added together (namely, those that you can take linear combinations of).

**Remark.** In short, to determine whether $v_{1},\ldots,v_{k}$ are linearly independent or linearly dependent is to examine whether the equation $c_{1}v_{1}+c_{2} v_{2}+\cdots +c_{k}v_{k}= \mathbf 0$ has only the trivial solution, or admits a nontrivial solution. Here $\mathbf 0$ denotes the corresponding "zero object" for this linear combination, whatever it may be.

**Example.** Determine whether the set of vectors $\{\begin{bmatrix}1\\2\end{bmatrix},\begin{bmatrix}2\\3\end{bmatrix},\begin{bmatrix}3\\4\end{bmatrix}\}$ is linearly independent or linearly dependent. If it is linearly dependent, exhibit an explicit linear dependence relation.

$\blacktriangleright$ We need to examine whether the equation
$$
c_{1}\begin{bmatrix}1\\2\end{bmatrix}+c_{2}\begin{bmatrix}2\\3\end{bmatrix}+c_{3}\begin{bmatrix}3\\4\end{bmatrix}=\vec 0
$$
has only the trivial solution, or admits a nontrivial solution. This amounts to deciding whether a certain homogeneous system $A\vec x=\vec 0$ has a unique solution or multiple solutions. We set up the relevant augmented matrix and row reduce:
$$
\begin{bmatrix} 1 & 2 & 3 & \vdots & 0 \\ 2 & 3 & 4 & \vdots & 0 \end{bmatrix} \stackrel{\text{row}}\sim \begin{bmatrix} 1 & 2 & 3 & \vdots & 0 \\ 0 & -1 & -2 & \vdots & 0 \end{bmatrix}
$$
Since this system has a free variable, we know there are nontrivial solutions. Hence the equation
$$
c_{1}\begin{bmatrix}1\\2\end{bmatrix}+c_{2}\begin{bmatrix}2\\3\end{bmatrix}+c_{3}\begin{bmatrix}3\\4\end{bmatrix}=\vec 0
$$
**has nontrivial solutions**, namely the set of vectors $\{\begin{bmatrix}1\\2\end{bmatrix},\begin{bmatrix}2\\3\end{bmatrix},\begin{bmatrix}3\\4\end{bmatrix}\}$ is **linearly dependent**. To exhibit such a linear dependence relation, note we can solve for the coefficients: $c_{3}$ is free, $c_{2}=-2c_{3}$, and $c_{1}=-2c_{2}-3c_{3}$. Since we just want one such linear dependence relation, we can pick $c_{3}=1$ to get $c_{2}=-2$, $c_{1}=1$, giving the nontrivial linear dependence relation
$$
\begin{bmatrix}1\\2\end{bmatrix}-2\begin{bmatrix}2\\3\end{bmatrix}+\begin{bmatrix}3\\4\end{bmatrix}=\vec 0. \quad\blacklozenge
$$
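As a quick sanity check, we can have a computer do the row reduction. Here is a minimal sketch, assuming Python with the `sympy` library (just one convenient choice; any CAS or hand computation works equally well):

```python
from sympy import Matrix

# Columns are the vectors [1,2], [2,3], [3,4]; we study the
# homogeneous system A c = 0 (the augmented zero column changes nothing).
A = Matrix([[1, 2, 3],
            [2, 3, 4]])

# rref() returns the reduced row echelon form and the pivot columns.
R, pivots = A.rref()
print(R)        # Matrix([[1, 0, -1], [0, 1, 2]])
print(pivots)   # (0, 1) -- the third column has no pivot, so c3 is free

# nullspace() gives a basis of solutions to A c = 0; any nonzero
# member is a nontrivial linear dependence relation.
print(A.nullspace())  # [Matrix([[1], [-2], [1]])], i.e. c1=1, c2=-2, c3=1
```

The nullspace vector $(1,-2,1)$ matches the coefficients we found by hand.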
## Intuition of linear independence / dependence with one or two vectors.

Let us recall what it means for a set of **one vector** to be linearly independent or dependent. If we have just $\{\vec v\}$, we need to ask when $c\vec v=\vec 0$ holds. If $\vec v\neq \vec 0$, then this can only happen if $c=0$. So if $\vec v\neq \vec 0$, then the set $\{\vec v\}$ is linearly independent. However, if $\vec v=\vec 0$ itself, we have the nontrivial linear dependence relation $1\vec v=\vec 0$. So if $\vec v=\vec 0$, then the set $\{\vec v\}$ is linearly dependent.

What does it mean for a set of **two vectors** to be linearly independent? Assume they are from the same space $\mathbb{R}^{n}$. Let us take the set $\{\vec v_{1},\vec v_{2}\}\subset \mathbb{R}^{n}$. To examine linear independence / dependence we look at the equation
$$
c_{1}\vec v_{1} + c_{2} \vec v_{2} = \vec 0 \iff c_{1}\vec v_{1} = -c_{2} \vec v_{2}.
$$
**If they are linearly independent**, then we must have $c_{1}=c_{2}=0$. **This implies neither vector can be a scalar multiple of the other.** Indeed, suppose to the contrary (without loss of generality) that $\vec v_{1} = \lambda \vec v_{2}$. Then $1 \vec v_{1}-\lambda\vec v_{2}=\vec 0$ is a nontrivial linear dependence relation, a contradiction.

**If they are linearly dependent**, then we have some nontrivial linear dependence relation, so $c_{1},c_{2}$ are not both zero. Without loss of generality, say $c_{1}\neq 0$. Then $\vec v_{1}=-\frac{c_{2}}{c_{1}}\vec v_{2}$. This implies **one of the vectors is a scalar multiple of the other**. (Note this scalar multiple can be $0$.)

So for a set of two vectors we have proved the following simple characterization:

> **Proposition.** Suppose we have $\{\vec v_{1},\vec v_{2}\}\subset \mathbb{R}^{n}$. Then $\{\vec v_{1},\vec v_{2}\}$ is linearly dependent if and only if one of the vectors is a scalar multiple of the other. Equivalently, $\{\vec v_{1},\vec v_{2}\}$ is linearly independent if and only if neither vector can be written as a scalar multiple of the other.

## More abstract examples.

Here is a more abstract example of using these definitions.

**Example.** Suppose $\vec a$ and $\vec b$ are linearly independent (say both are vectors from $\mathbb{R}^{n}$, for some $n$). Prove $\vec a$ and $\vec a+\vec b$ are linearly independent.

$\blacktriangleright$ Proof. Suppose $\vec a$ and $\vec b$ are linearly independent. This means the equation $c_{1}\vec a+c_{2}\vec b=\vec 0$ has only the trivial solution $c_{1}=c_{2}=0$. To examine the linear independence of $\vec a$ and $\vec a+\vec b$, let us consider the equation
$$
t_{1}\vec a + t_{2}(\vec a+\vec b)=\vec 0
$$
for some $t_{1},t_{2}$. Note we can rewrite the above as
$$
(t_{1}+t_{2})\vec a + t_{2}\vec b=\vec 0,
$$
and as $\vec a$ and $\vec b$ are linearly independent, we must have $t_{1}+t_{2}=0$ and $t_{2}=0$. This means $t_{1}=0$ and $t_{2}=0$, which means $\vec a$ and $\vec a+\vec b$ are linearly independent. $\blacksquare$

You need to try to understand this argument. Once you do, your third eye is opening to the idea of linear independence!

And since we can take linear combinations of things that are not just column vectors, we can also inquire about the linear independence of other kinds of objects.

**Example.** Is the set of polynomials $\{x+1, x+2, x+3\}$ linearly independent or linearly dependent? If linearly dependent, find an explicit nontrivial linear combination of them that makes $0$.

$\blacktriangleright$ To determine this we set up the equation
$$
c_{1}(x+1)+c_{2}(x+2)+c_{3}(x+3) = 0
$$
and check how many solutions it has for $c_{1},c_{2},c_{3}$. By matching powers of $x$, we have the system of equations
$$
\begin{align*}
c_{1}+c_{2}+c_{3}&=0 \\
c_{1}+2c_{2}+3c_{3}&=0
\end{align*}
$$
which has augmented matrix
$$
\begin{bmatrix}1 & 1 & 1 & \vdots & 0 \\ 1 & 2 & 3 & \vdots & 0\end{bmatrix} \stackrel{\text{row}}\sim \begin{bmatrix}1 & 1 & 1 & \vdots & 0 \\ 0 & 1 & 2 & \vdots & 0\end{bmatrix}
$$
which we see has infinitely many solutions, as $c_{3}$ is a free variable. To find an explicit linear combination, we can set $c_{3}=1$, and note that $c_{2}+2c_{3}=0$ and $c_{1}+c_{2}+c_{3}=0$. So this gives $c_{2}=-2$ and $c_{1}=1$. Hence we have the nontrivial linear combination for $0$ given by
$$
1(x+1)-2(x+2)+(x+3)=0. \quad\blacklozenge
$$

Again, we will continue to speak more of linear independence and linear dependence. Along with linear combinations and span, these are central ideas in linear algebra!
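The polynomial example can be machine-checked the same way, since matching powers of $x$ turns it into the same kind of homogeneous system. Here is a minimal sketch, again assuming Python with `sympy`, that recovers the dependence relation found above:

```python
from sympy import symbols, Matrix, expand

x = symbols('x')
polys = [x + 1, x + 2, x + 3]

# Matching powers of x gives a homogeneous linear system:
# one row for the coefficients of x, one for the constant terms.
A = Matrix([[1, 1, 1],    # coefficients of x
            [1, 2, 3]])   # constant terms

# A nonzero nullspace vector is a nontrivial dependence relation.
c = A.nullspace()[0]      # Matrix([[1], [-2], [1]])

# Verify: 1*(x+1) - 2*(x+2) + 1*(x+3) expands to 0.
combo = sum(ci * p for ci, p in zip(c, polys))
print(expand(combo))      # 0
```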